Adds Ollama service#250

Merged
crypt0rr merged 7 commits into tailscale-dev:main from RychidM:add-ollama-service
Apr 12, 2026
Conversation

@RychidM
Contributor

@RychidM RychidM commented Apr 7, 2026

Pull Request Title: Add Ollama Service

Description

Adds a Tailscale sidecar configuration for Ollama, a tool for running large language models (LLMs) locally. This lets users access their local models securely from any device on their Tailnet — including phones and remote machines — without exposing the Ollama API to the public internet.

Includes:

  • compose.yaml following the ScaleTail sidecar pattern (network_mode: service:tailscale, health checks, depends_on with health condition)
  • .env template with SERVICE, IMAGE_URL, TS_AUTHKEY, and an optional OLLAMA_API_KEY
  • README.md covering prerequisites, volumes, MagicDNS/HTTPS setup, optional LAN port exposure, and first-run model pull instructions
  • Optional yourNetwork external network on the Tailscale container, enabling other containers (e.g. Open WebUI) to reach Ollama via inter-container networking
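A minimal sketch of the sidecar pattern the bullets above describe (service names, paths, and healthcheck values here are illustrative assumptions, not the exact compose.yaml from this PR):

```yaml
# Illustrative sketch of the ScaleTail sidecar pattern; not the exact file from this PR.
services:
  tailscale:
    image: tailscale/tailscale:latest
    container_name: tailscale-${SERVICE}
    hostname: ${SERVICE}
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./ts/state:/var/lib/tailscale
      - ./config:/config
    healthcheck:
      test: ["CMD", "tailscale", "status"]
      interval: 10s
      timeout: 5s
      retries: 5
    restart: unless-stopped

  ollama:
    image: ${IMAGE_URL}
    network_mode: service:tailscale   # share the Tailscale container's network namespace
    volumes:
      - ./ollama-data:/root/.ollama   # model store; models can be several GB each
    depends_on:
      tailscale:
        condition: service_healthy    # only start once the sidecar reports healthy
    restart: unless-stopped
```

Because of `network_mode: service:tailscale`, the Ollama API is reachable on the Tailscale node's Tailnet IP rather than on a host port.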

Related Issues

N/A

Type of Change

  • Bug fix
  • New feature
  • Documentation update
  • Refactoring

How Has This Been Tested?

  1. Ran docker compose config — no errors or missing variable warnings.
  2. Pre-created bind-mount directories (config, ts/state, ollama-data) and started the stack with docker compose up -d.
  3. Confirmed both containers reached healthy status via docker compose ps.
  4. Verified Tailscale node registration and Tailnet IP with docker exec tailscale-ollama tailscale status.
  5. Pulled tinyllama and sent a test generation request via curl to the Tailnet IP on port 11434 — received a valid JSON response.
  6. Repeated the curl test from a second device on the same Tailnet to confirm remote access works end-to-end.
  7. Verified tailscale-ollama appears in docker network inspect yourNetwork.
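The generation request in steps 5–6 can be reproduced from any Tailnet device with a small client sketch like the one below. The Tailnet IP is a placeholder, and the helper names (`build_payload`, `generate`) are illustrative, not part of this PR; the `/api/generate` endpoint and its `model`/`prompt`/`stream` fields are Ollama's standard generate API.

```python
import json
import urllib.request

# Placeholder: replace with the Tailnet IP shown by `tailscale status`.
OLLAMA_URL = "http://100.x.y.z:11434"

def build_payload(prompt: str, model: str = "tinyllama") -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "tinyllama") -> dict:
    """POST the payload to the Ollama sidecar and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

This is functionally the same check as the `curl` in step 5, just wrapped for reuse; it only succeeds from a device that is logged in to the same Tailnet.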

Checklist

  • I have performed a self-review of my code
  • I have added tests that prove my fix or feature works
  • I have updated necessary documentation (e.g. frontpage README.md)
  • Any dependent changes have been merged and published in downstream modules

Screenshots (if applicable)

N/A — no visual UI changes.

Additional Notes

  • The yourNetwork network is optional. Users who don't need inter-container communication can remove the networks: sections from compose.yaml entirely.
  • Ollama models can be large (several GB each). The README notes this under the Volumes section so users are aware before first pull.
  • OLLAMA_KEEP_ALIVE is set to 24h by default to keep models warm; users can adjust or remove this to suit their hardware.
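Pulling the notes above together, the .env template might look roughly like this (all values are placeholders; the exact variable set in the PR may differ):

```
# Illustrative .env sketch; values are placeholders.
SERVICE=ollama
IMAGE_URL=ollama/ollama:latest
TS_AUTHKEY=tskey-auth-REPLACE_ME
TZ=Etc/UTC                 # time zone applied to the containers
OLLAMA_KEEP_ALIVE=24h      # keeps loaded models warm; lower or remove on constrained hardware
#OLLAMA_API_KEY=           # optional; leave commented out if unused
```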

@RychidM RychidM changed the title from "Adds Ollama service configuration with Docker Compose" to "Adds Ollama service" Apr 9, 2026
@jackspiering jackspiering added the "new service" label (request to add a new service) Apr 11, 2026
Collaborator

@jackspiering jackspiering left a comment


Please comment the requested lines out. After that, it can be merged.

Here is a screenshot of the working container 💯

[image: screenshot of the working container]

@RychidM
Contributor Author

RychidM commented Apr 12, 2026

@jackspiering I have made the requested changes, please have a look.

@jackspiering
Collaborator

jackspiering commented Apr 12, 2026

> @jackspiering I have made the requested changes, please have a look.

Thank you for the edit. However, the comment in the compose.yaml has not been edited; this could generate an error. Please comment that line out, my comment is still open.

Besides that could you please add Ollama to the general README.md table?

Thank you!

@RychidM
Contributor Author

RychidM commented Apr 12, 2026

> > @jackspiering I have made the requested changes, please have a look.
>
> Thank you for the edit. However, the comment in the compose.yaml has not been edited; this could generate an error. Please comment that line out, my comment is still open.
>
> Besides that could you please add Ollama to the general README.md table?
>
> Thank you!

Please have a look, I have edited the comment in the compose.yaml and added Ollama to the general readme table.

- Add time zone setting for containers in .env
- Improve formatting of the configuration table in README
Collaborator

@crypt0rr crypt0rr left a comment


Thanks! ✌🏼

@crypt0rr crypt0rr dismissed jackspiering’s stale review April 12, 2026 10:53

Checked for the requested changes, these are addressed.

@crypt0rr crypt0rr merged commit 7dc347b into tailscale-dev:main Apr 12, 2026
1 check passed
@RychidM RychidM deleted the add-ollama-service branch April 12, 2026 11:40